Fix: Prevent UnionTransformer type ambiguity in combination with PyTorchTypeTransformer #2726
Why are the changes needed?
This example fails with:
The type transformer choice should, however, not be ambiguous.
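In miniature, the failure mode looks like this. The sketch below is a simplified, hypothetical illustration of how a union transformer tries each variant and fails when more than one matches; the function and transformer names are made up and are not flytekit's actual implementation:

```python
from typing import Any, Callable, List


def list_transformer(value: Any) -> str:
    # Stand-in for the ListTransformer: only handles lists.
    if not isinstance(value, list):
        raise TypeError("not a list")
    return f"list of {len(value)} elements"


def tensor_transformer(value: Any) -> str:
    # Stand-in for the PyTorchTensorTransformer. It mirrors the bug:
    # since torch.save([some_tensor], ...) also works on a *list* of
    # tensors, this transformer accepts lists instead of rejecting them.
    return "saved via torch.save"


def to_union_literal(value: Any, variants: List[Callable[[Any], str]]) -> str:
    # Like the UnionTransformer: try every variant; if more than one
    # succeeds, the choice is ambiguous and we raise.
    successes = []
    for transform in variants:
        try:
            successes.append(transform(value))
        except TypeError:
            pass
    if len(successes) > 1:
        raise TypeError("Ambiguous choice of variant for union type")
    if not successes:
        raise TypeError("No variant matched the given value")
    return successes[0]
```

With both stand-in transformers accepting a list, `to_union_literal([1, 2], [list_transformer, tensor_transformer])` raises the ambiguity error, which is the shape of the failure described below.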
The problem arises as follows: in the `UnionTransformer` we test all union variants and raise the observed error if more than one of the variants works. In this case, the two working variants are:

1. The `ListTransformer` for the list of tensors, combined with the `PyTorchTensorTransformer` for each element of the list. This is the expected behaviour and does work.
2. The `PyTorchTensorTransformer` for the whole list of tensors. This works as well, since `torch.save([some_tensor], ...)` works. It is not intended, however, and hence causes the ambiguity.

What changes were proposed in this pull request?
We need to prevent the PyTorch type transformers from serializing lists of tensors or modules, since the type engine (i.e. the `UnionTransformer`) expects these cases to be handled by the `ListTransformer`.

How was this patch tested?
Tested with the snippet above and added a unit test that would have caught this.
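For illustration, the kind of guard the change implies can be sketched as follows. The class and method names here are hypothetical and this is not the actual patched code; it only shows the idea of rejecting lists so the `ListTransformer` stays responsible for them:

```python
from typing import Any


class PyTorchTensorTransformerSketch:
    """Illustrative sketch only: refuse lists up front so that
    List[torch.Tensor] is handled by the ListTransformer and the
    union variant choice stays unambiguous."""

    def to_literal(self, python_val: Any) -> str:
        # Guard: even though torch.save([tensor, ...]) would succeed
        # on a list, reject it here to avoid the union ambiguity.
        if isinstance(python_val, list):
            raise TypeError(
                "PyTorchTensorTransformer does not handle lists; "
                "the ListTransformer is responsible for them"
            )
        return "serialized single tensor"
```

With this guard in place, only the `ListTransformer` variant succeeds for a list of tensors, so the `UnionTransformer` no longer sees two working variants.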